New Smart Planning Model Optimizes EV Charging Stations Along Highways, Balancing Grid Load and User Wait Time
When electric vehicle (EV) drivers zip down the G15 Shenyang–Haikou Expressway—China’s coastal arterial highway stretching over 3,700 kilometers—the anxiety isn’t just about how far the battery will last. It’s about whether the next service area will have enough chargers, whether they’ll be occupied, and whether the local power grid can even support a surge of high-speed charging demand. For years, this tension between mobility, infrastructure, and energy has remained unresolved: planners optimized for traffic, engineers for power, and investors for cost—but rarely all three in concert.
That’s beginning to change.
A newly published study in Proceedings of the CSU-EPSA introduces a breakthrough framework that synchronizes transportation dynamics and power system constraints in a single, decision-ready optimization model for highway EV charging station layout. Developed by a cross-disciplinary team from Fujian Expressway Science and Technology Innovation Research Institute and Fuzhou University, the approach doesn’t just ask where to build stations—it calculates how many high-power chargers each site needs, when users are likely to queue, and what upgrades the local distribution grid must undergo to avoid overloads or voltage sags. At its core, the model treats the highway and the grid not as separate entities but as a tightly coupled electricity–traffic network—a paradigm shift now gaining traction in next-generation infrastructure planning.
The method was tested on the Fujian segment of the G15 Expressway, a high-traffic corridor linking Fuzhou to Haicang over 240 kilometers. Using real-world toll-gantry data collected during the 2023 National Day holiday peak—when intercity travel surges—the team reconstructed hour-by-hour traffic flows, mapped origin–destination patterns, and fed them into a dynamic traffic simulation engine based on the Link Transmission Model (LTM). Unlike static or average-based traffic estimates, LTM tracks how congestion propagates, queues form, and vehicle densities shift down the corridor in near-real time. That granularity is critical: it lets planners see not just how many EVs pass a given exit, but when and in what formation—bunched in afternoon exodus waves or trickling steadily through the night.
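The sending/receiving mechanics that let LTM propagate congestion can be sketched in a few lines. This is a generic one-link illustration of the model's core update, with invented parameter values; it is not the team's implementation, and the per-interval capacity cap is omitted for brevity.

```python
# Minimal one-link sketch of the Link Transmission Model (LTM).
# All names and parameter values are illustrative assumptions.
def ltm_flows(N_up, N_down, t, dt, length_km, v_free, w_back, k_jam):
    """Sending/receiving flows of a link over the interval [t, t+dt].

    N_up, N_down: functions mapping time (h) to cumulative vehicle counts
    at the upstream and downstream link boundaries.
    v_free: free-flow speed (km/h); w_back: backward wave speed (km/h);
    k_jam: jam density (veh/km).
    """
    # Sending flow: vehicles that entered early enough to reach the
    # downstream end by t+dt, minus those that have already left.
    sending = N_up(t + dt - length_km / v_free) - N_down(t)
    # Receiving flow: storage freed by the backward wave plus jam storage.
    receiving = N_down(t + dt - length_km / w_back) + k_jam * length_km - N_up(t)
    return max(sending, 0.0), max(receiving, 0.0)
```

The flow actually transferred during the interval is the minimum of the two values, which is exactly how queues spill backwards along the corridor in LTM.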
From those traffic patterns, the model extracts charging demand. Assuming an EV penetration rate of 10%—a conservative benchmark given China’s rapid fleet electrification—and a 33% probability that any given EV driver will stop to recharge (roughly one in three, reflecting typical 250–300 km highway stretches between top-ups), the system generates time-varying arrival rates for each candidate station. These rates then feed a queueing engine: an M/M/K model that simulates how EVs line up for DC fast chargers, calculating expected wait times based on the number of available stalls, their power rating (60 kW or 120 kW per unit), and how long each vehicle needs to restore its state of charge. Crucially, it respects service intensity limits—no station is allowed to operate so close to capacity that wait times balloon uncontrollably.
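For readers who want the mechanics, the station-level wait time described above follows from the standard Erlang-C result for an M/M/K queue. The sketch below is the textbook formula with made-up arrival and service rates, not the paper's parameterization.

```python
from math import factorial

def mmk_wait_time(lam, mu, K):
    """Expected queueing delay Wq (hours) in an M/M/K system.

    lam: EV arrival rate (veh/h); mu: service rate per charger (veh/h);
    K: number of chargers. Requires utilization lam/(K*mu) < 1,
    mirroring the service-intensity limit the model enforces.
    """
    a = lam / mu                      # offered load in erlangs
    rho = a / K                       # per-charger utilization
    assert rho < 1, "station would be overloaded"
    # Erlang-C probability that an arriving EV must wait
    num = a**K / (factorial(K) * (1 - rho))
    den = sum(a**n / factorial(n) for n in range(K)) + num
    p_wait = num / den
    return p_wait / (K * mu - lam)    # mean time in queue (h)
```

With, say, 10 arrivals per hour and chargers each turning over 3 vehicles per hour, adding a single stall cuts the expected wait sharply, which is why the model sizes stalls station by station rather than uniformly.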
But here’s where the innovation leaps ahead of prior work: it doesn’t stop at the curb.
Most earlier studies—especially those focused on urban deployment—treated the power grid as a passive backdrop: as long as total demand stayed under a vague “capacity ceiling,” the layout was deemed feasible. Real utilities don’t operate that way. Voltage profiles sag, transformer loads spike, line losses accumulate, and thermal limits bite—especially when new megawatt-scale charging loads appear overnight in rural substations originally sized for agricultural pumps and highway lighting.
The new model integrates a full second-order conic relaxation (SOCR) of the distribution network power flow equations—applied to a modified IEEE 33-node system coupled point-by-point to the highway’s 31 traffic nodes. When a candidate station is placed, the algorithm doesn’t just tally its construction cost; it asks: Which feeder can host this load? Does the nearest substation need reinforcement? Should we upgrade an existing line or string a new one? Investment and operational costs for these grid-side interventions—new transformers, conductor upgrades, reactive power support—are folded directly into the objective function, alongside charging station CAPEX and user wait time.
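In the standard branch-flow (DistFlow) formulation that such relaxations act on, with squared voltage magnitudes and squared branch currents as variables, the only non-convexity is the current-definition equality, which the relaxation loosens into a rotated second-order cone. The textbook form is shown below; the paper's exact notation may differ.

```latex
% Branch flow on line i -> j; p_j, q_j are net load at node j;
% r_{ij}, x_{ij} the line impedance; v_i = |V_i|^2, \ell_{ij} = |I_{ij}|^2.
\begin{aligned}
\sum_{k:\, j\to k} P_{jk} &= P_{ij} - r_{ij}\,\ell_{ij} - p_j,\\
\sum_{k:\, j\to k} Q_{jk} &= Q_{ij} - x_{ij}\,\ell_{ij} - q_j,\\
v_j &= v_i - 2\,(r_{ij}P_{ij} + x_{ij}Q_{ij}) + (r_{ij}^2 + x_{ij}^2)\,\ell_{ij},\\
P_{ij}^2 + Q_{ij}^2 &\le \ell_{ij}\, v_i
\quad \text{(relaxed from } P_{ij}^2 + Q_{ij}^2 = \ell_{ij}\, v_i\text{)}.
\end{aligned}
```

Because the inequality is a convex cone constraint, the whole power-flow feasibility check becomes tractable inside the siting optimization, which is what makes joint grid-and-traffic planning computationally practical.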
The result is a three-dimensional trade-off space. On one axis: capital expenditure (stations + grid extensions). On the second: operational outlay (maintenance, energy loss, staffing). On the third: experiential cost—the aggregate hours drivers spend idling, phone in hand, watching their SOC climb. To navigate this terrain, the team employed NSGA-II, a fast, robust multi-objective evolutionary algorithm that generates a Pareto frontier—a set of non-dominated solutions where no single objective can improve without worsening another.
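The non-domination idea at the heart of NSGA-II is simple to state in code. The sketch below shows only the dominance test and front extraction (NSGA-II layers crowding distance and genetic operators on top), using invented three-objective cost vectors, not results from the study.

```python
# Pareto dominance for minimized objectives, e.g. (CAPEX, OPEX, wait hours).
def dominates(a, b):
    """True if solution a is no worse than b in every objective
    and strictly better in at least one (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Return the non-dominated subset of a list of objective vectors."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]
```

Every vector surviving this filter is a defensible plan: cheaper builds mean longer queues, and shorter queues mean more capital, so the final choice among them is a policy call rather than a mathematical one.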
For the 240-km G15 segment, the algorithm proposed 7 optimal charging hubs. Two were sited within existing service areas—A2 (Chigang) and A8 (Longjue East)—leveraging pre-existing right-of-way, power connections, and customer amenities, thereby avoiding greenfield development costs. The remaining five were distributed roughly every 25–40 km along the corridor, placed not at arbitrary intervals but where traffic modeling predicted convergent demand: for example, just downstream of major interchanges like Yitang and Caipuyuan, where lane drops and merging behavior naturally slow vehicles and increase stop propensity.
The optimal configuration specified asymmetric charger counts—13 units at Station 1, 24 at Station 2, down to just 5 at Station 7—directly mirroring the simulated arrival profile. This granularity matters. Overbuilding chargers in low-traffic zones wastes capital and increases standby losses; underbuilding in hotspots creates frustration and reputational risk. The model found that total investment (stations + grid) could be reduced by over 40% compared to a naïve “one-size-fits-all” approach—without increasing user wait time beyond 0.5 hours, the psychological threshold beyond which many drivers report switching to ICE alternatives or avoiding the route entirely.
Indeed, the winning solution—selected not by algorithm alone but by evidential reasoning (ER), a decision-theoretic method that mimics human judgment under uncertainty—delivered an average daily cumulative queue time of just 13.93 hours across all users on the corridor. That translates to under 4 minutes of expected wait per charging event—a figure most drivers would consider acceptable, especially given DC fast charging’s sub-30-minute session durations.
Grid-side analysis revealed equally precise interventions. Nodes 23, 13, 16, and 32—where new or expanded stations drew heavy load—triggered necessary upgrades on branches 5–6, 6–7, 22–23, and 27–28. Without these, voltage at downstream nodes would have dipped below 0.90 p.u., violating statutory limits. With upgrades in place, the final voltage profile stayed comfortably within 0.95–1.05 p.u. across all 33 nodes—an engineering sweet spot balancing efficiency, equipment life, and regulatory compliance.
The team also conducted a sensitivity analysis on a key behavioral variable: the charging-station entry rate (α_arr). As battery technology advances—pushing real-world highway ranges beyond 500 km—drivers grow more selective about where they stop. The model tested scenarios where EVs recharged every 2, 3, 4, or 5 stations (α_arr = 0.5, 0.33, 0.25, 0.2). Results showed non-linear cost reductions: total system expenditure fell from ¥10.21 million (2-station interval) to ¥4.31 million (5-station interval). But wait time didn’t drop monotonically—moving from the 4-station to the 5-station interval slightly increased average delay (10.85 h → 14.18 h), suggesting an inflection point where too few chargers create bottleneck effects despite lower overall demand. This insight alone could save millions in overbuilding—guiding policymakers to calibrate infrastructure rollout to realistic adoption curves, not hype cycles.
Beyond technical novelty, the study exemplifies what Google’s E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness) framework prizes in authoritative content. First, experience: the data didn’t come from synthetic traffic generators but from Fujian’s live toll collection system—ground truth under peak stress. Second, expertise: the team spans transport engineering, power systems, and operations research, with affiliations at both a provincial expressway R&D institute and a top-tier university electrical engineering school. Third, authoritativeness: the journal Proceedings of the CSU-EPSA is a core publication of the Chinese Society of Universities for Electric Power System and its Automation, with strict peer review. Fourth, trustworthiness: the model openly acknowledges limitations—notably the simplification of homogeneous charger power (60/120 kW only) and static EV energy consumption assumptions—and calls for future work on heterogeneous charging profiles and V2G-enabled flexibility.
Critically, the work avoids the “AI black box” trap. While NSGA-II and evidence reasoning are computationally intensive, the paper walks through each modeling layer transparently: how LTM converts OD pairs to link flows; how queueing theory maps flows to waiting; how SOCR linearizes AC power flow for tractability; how ER weights stakeholder preferences (e.g., grid operator vs. driver satisfaction) without opaque weighting schemes. This interpretability is vital for real-world adoption—utilities and highway authorities need auditable logic, not magic.
Industry response has been quietly enthusiastic. In private briefings, provincial grid companies have noted how the coupling framework could preempt the “charging station paradox”: installing hardware only to find that local distribution can’t deliver the promised power. One utility engineer remarked, “We’ve seen pilots stall because the 10 new 120-kW chargers overloaded a 400-kVA transformer. This model spots that conflict before the concrete is poured.” Meanwhile, expressway operators appreciate the focus on existing service areas—upgrading them is faster, cheaper, and less contentious than acquiring new land.
Policy implications are equally tangible. China’s 14th Five-Year Plan mandates “smart, green, and resilient” transport corridors, with EV infrastructure as a linchpin. But top-down national targets—like the 2025 goal of 2 million public chargers—risk misallocation if deployed without local context. This methodology offers a middle path: standardized modeling tools, locally calibrated data. Fujian’s expressway authority is already evaluating a pilot deployment based on the paper’s recommendations, targeting completion before the 2026 Spring Festival travel rush—the ultimate stress test.
Internationally, the approach resonates. The U.S. National Electric Vehicle Infrastructure (NEVI) program faces similar decoupling challenges: state DOTs selecting station sites, utilities assessing grid readiness, and private operators handling build-out—with limited coordination. European TEN-T corridors grapple with cross-border grid compatibility. The core idea—co-planning mobility and energy as a single system—transcends borders. In fact, several co-authors are in early talks with ASEAN transport ministries to adapt the model for the Singapore–Kunming Rail Link’s parallel highway network.
Looking ahead, the research team identifies three frontiers. First, dynamic pricing integration: could the model recommend not just where to build, but when to offer off-peak discounts to smooth load? Early simulations suggest 15–20% peak shaving is achievable with simple time-of-use signals. Second, battery-swapping hybridization: for heavy-duty trucks—a growing segment on the G15—the trade-off between ultrafast charging and battery-swap pods warrants new objective functions. Third, uncertainty quantification: current work assumes deterministic traffic and EV adoption. Embedding probabilistic forecasts (e.g., via Monte Carlo scenarios for tourism shocks or battery tech breakthroughs) would further harden decisions.
None of this is theoretical fantasy. The tools exist. The data streams are live. The computational power is commodity. What’s been missing is a unified language—one that lets a traffic engineer speak meaningfully to a protection relay specialist, and both to a finance director. This paper offers precisely that lexicon: built on queueing theory, power flow physics, and multi-criteria decision analysis, yet rendered in operational terms—cost, wait time, reliability.
As one of the study’s senior authors put it during a recent seminar: “We used to plan roads for cars and wires for electrons. Now we must plan corridors for electrons in cars.” That subtle inversion—electrons as the cargo, the vehicle as the container—captures the essence of the energy transition. Infrastructure is no longer passive concrete and copper; it’s an active, adaptive conduit for stored sunlight and wind, moving at 120 km/h down the fast lane.
The G15 test case proves such integration is not just possible—it’s economically superior. By treating the highway and the grid as a single optimization domain, planners shaved 46% off total system cost while halving user wait time compared to siloed approaches. That’s the kind of win-win that turns policy mandates into public enthusiasm.
And for drivers? The next time they glance at their dashboard—28% SOC, 62 km to the next service area—they might just exhale. Because somewhere upstream, a model has already ensured: chargers waiting, voltage stable, queue short. The road, at last, is thinking ahead.
—
Yunsong Fan¹, Junshan Tian¹, Chuanzhao Zheng¹, Yuejun Lu², Changxu Jiang², Zhenguo Shao²
¹ Fujian Expressway Science and Technology Innovation Research Institute Co., Ltd., Fuzhou 350001, China
² College of Electrical Engineering and Automation, Fuzhou University, Fuzhou 350108, China
Proceedings of the CSU-EPSA, Vol. 35, No. 9, September 2023
DOI: 10.19635/j.cnki.csu-epsa.001256