HMI Challenges In Shared Control Between Driver And ADAS

Hello guys, welcome back to our blog. In this article, I will discuss the HMI challenges in shared control between driver and ADAS, as well as the tools and technologies used in HMI development.

Ask questions if you have any electrical, electronics, or computer science doubts. You can also catch me on Instagram – CS Electrical & Electronics

HMI Challenges In Shared Control Between Driver And ADAS

The rapid advancement of Advanced Driver Assistance Systems (ADAS) is reshaping the driver’s role in modern vehicles. As we transition from driver-only control to increasing levels of vehicle autonomy, the line between human and machine input becomes blurred. This has created a new set of challenges, especially in the domain of Human-Machine Interfaces (HMI).

In a shared control scenario, both the driver and ADAS have influence over vehicle behavior. This relationship requires a seamless, intuitive, and safe interaction model—one that is often complex to design. Poor HMI design can lead to confusion, over-reliance, reduced trust, and even accidents.

This article explores the intricacies of HMI in the shared control era: the challenges, human factors, design considerations, real-world examples, and how the industry is tackling them.

Understanding Shared Control in ADAS

What is Shared Control?

Shared control refers to a driving paradigm where both the driver and vehicle are jointly responsible for operating the car. This typically exists in Level 1 to Level 3 automation as defined by SAE.

Examples include:

  • Lane Keeping Assist (LKA) – The system controls steering, but the driver must be alert.
  • Adaptive Cruise Control (ACC) – Vehicle manages speed and distance, but the driver retains overall control.
  • Highway Pilot Systems – Vehicle drives autonomously on highways but may request driver intervention.

Why Is HMI Important in Shared Control?

In shared control, communication between the system and the human becomes critical. The driver needs to know:

  • What the system is doing
  • What it is planning to do
  • When it expects the driver to take over
  • Whether the system is functioning properly

A well-designed HMI can ensure that this interaction is safe, intuitive, and trustworthy.

Core HMI Challenges in Shared Control

01. Mode Confusion

Problem: Drivers may not always be aware of which system mode the vehicle is in—manual, partial assist, or fully automated.

Impact:

  • Delay in reaction during handover
  • Over-reliance on the system
  • Misunderstanding system limitations

Example: In Tesla Autopilot or similar systems, some users assume “full self-driving” capability, leading to dangerous misuse.
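One common mitigation is to make every mode transition explicit and require the driver to acknowledge it before it takes effect. Here is a minimal Python sketch of that idea; the mode names and the confirmation flow are illustrative assumptions, not any vendor’s actual API:

```python
from enum import Enum, auto

class DriveMode(Enum):
    MANUAL = auto()
    PARTIAL_ASSIST = auto()   # e.g., ACC + LKA active
    AUTOMATED = auto()        # hands-off, eyes-on

class ModeManager:
    """Tracks the active mode and forces an explicit driver acknowledgment
    on every transition, so the HMI never changes mode silently."""

    def __init__(self):
        self.mode = DriveMode.MANUAL
        self.pending = None

    def request_transition(self, target):
        # A transition is only proposed; the HMI announces it first.
        self.pending = target
        return f"HMI: confirm switch to {target.name}"

    def acknowledge(self):
        # Driver confirmed (button press, voice, etc.) -> commit the mode.
        if self.pending is not None:
            self.mode = self.pending
            self.pending = None
        return self.mode
```

Because the mode never changes without an announced, acknowledged transition, the driver always has an opportunity to register which mode the vehicle is in.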

02. Trust Calibration

Problem: Over-trust or under-trust in ADAS features.

  • Over-trust leads to distraction and disengagement.
  • Under-trust results in refusal to use assistive features.

HMI Role:

  • Build trust with transparency
  • Offer clear feedback on what the system sees, plans, and expects

03. Takeover Requests

Problem: When the system detects a scenario beyond its capabilities, it sends a takeover request (TOR). The challenge is ensuring the driver responds quickly and correctly.

Factors Affecting TOR Success:

  • Driver attentiveness
  • Situation urgency
  • Time-to-takeover (TTTO)
  • HMI modalities (visual, auditory, haptic)

Key HMI Questions:

  • How much lead time should be given?
  • What type of alert is most effective?
  • Is the request context-aware?
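These questions can be made concrete with a simple escalation policy: as the remaining time-to-takeover shrinks, or if the driver is inattentive, additional modalities are layered on. A Python sketch of such a policy follows; all thresholds are illustrative assumptions, not values taken from any standard:

```python
def tor_alert_plan(ttto_s, driver_attentive):
    """Pick escalating alert modalities for a takeover request (TOR)
    from the remaining time-to-takeover (seconds) and driver state.
    Thresholds are illustrative only."""
    modalities = ["visual"]            # always shown on the cluster
    if ttto_s < 10 or not driver_attentive:
        modalities.append("auditory")  # chime when time is short or driver distracted
    if ttto_s < 5:
        modalities.append("haptic")    # seat/wheel vibration when urgent
    urgency = ("critical" if ttto_s < 5
               else "warning" if ttto_s < 10
               else "advisory")
    return {"urgency": urgency, "modalities": modalities}
```

The key design choice is that escalation is monotonic: a more urgent situation never produces fewer cues than a less urgent one, which keeps the alert pattern predictable for the driver.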

04. Driver State Monitoring

Problem: Ensuring that the driver is alert and ready to take control.

Solution: HMI systems integrated with Driver Monitoring Systems (DMS) using:

  • Eye-tracking
  • Head pose estimation
  • Steering behavior
  • Biometric sensors

Challenges:

  • Avoiding false positives/negatives
  • Privacy concerns
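A common way to reduce false positives is hysteresis: the evidence required to flag the driver as distracted is stricter than the evidence required to return them to the attentive state, so a blink or a quick mirror check does not trigger an alarm. A Python sketch, fusing two hypothetical DMS signals with illustrative thresholds:

```python
def attention_state(gaze_on_road_ratio, secs_since_steering, prev_state="attentive"):
    """Fuse two DMS signals into a coarse driver state with hysteresis.
    gaze_on_road_ratio: fraction of recent gaze samples on the road (0..1).
    secs_since_steering: seconds since the last steering input.
    All thresholds are illustrative assumptions."""
    if prev_state == "attentive":
        # Only flag distraction on strong combined evidence.
        if gaze_on_road_ratio < 0.4 and secs_since_steering > 8:
            return "distracted"
        return "attentive"
    # Require clearly recovered signals before returning to attentive.
    if gaze_on_road_ratio > 0.7 and secs_since_steering < 3:
        return "attentive"
    return "distracted"
```

The asymmetric thresholds (0.4 to leave attentive, 0.7 to re-enter it) are what damp the oscillation between states that a single threshold would cause.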

05. Information Overload

Problem: Displaying too much data can overwhelm the driver, while too little can leave them uninformed.

HMI Balance:

  • Prioritize information
  • Use hierarchical layers (what’s important now vs. later)
  • Adapt content based on context

Example: Augmented Reality (AR) HUDs showing lane assist or navigation must not clutter the driver’s view.
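Prioritization can be sketched as a small selection step: only the most urgent messages are shown, and the rest are deferred rather than dropped. A minimal Python example using a heap; the priority values and message texts are illustrative assumptions:

```python
import heapq

def select_hmi_messages(candidates, max_shown=2):
    """Keep only the most urgent messages so the display stays uncluttered.
    candidates: list of (priority, message) tuples, lower number = more urgent.
    Returns (shown, deferred); deferred items can surface later."""
    heap = list(candidates)
    heapq.heapify(heap)
    shown = [heapq.heappop(heap)[1] for _ in range(min(max_shown, len(heap)))]
    deferred = [msg for _, msg in sorted(heap)]
    return shown, deferred
```

The "what’s important now vs. later" layering from the list above maps directly onto the shown/deferred split: deferring instead of discarding preserves the information without competing for the driver’s attention.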

06. Multimodal Communication Conflicts

Problem: Using multiple feedback methods (visual, audio, haptic) simultaneously can lead to conflicting or ignored cues.

Design Principles:

  • Ensure modalities are complementary
  • Reduce cognitive load
  • Customize feedback based on user preference or situation
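One way to keep modalities complementary is an arbitration step: visual and haptic cues can coexist, but at most one auditory cue plays at a time (the most urgent one), since overlapping sounds are the cues most likely to mask each other. A Python sketch; the cue format and the arbitration scheme are illustrative assumptions:

```python
def arbitrate_cues(active_cues):
    """Resolve simultaneous feedback cues so modalities complement rather
    than compete. Cue format: (urgency, modality, text), lower urgency
    number = more important. Only the most urgent auditory cue is kept."""
    auditory = [c for c in active_cues if c[1] == "auditory"]
    others = [c for c in active_cues if c[1] != "auditory"]
    kept = sorted(others)
    if auditory:
        kept.append(min(auditory))   # single sound: the most urgent one
    return [text for _, _, text in sorted(kept)]
```

Suppressed auditory cues would typically fall back to their visual counterparts, so no information is silently lost.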

07. Cultural and Demographic Variability

Problem: Perception of automation and feedback differs across cultures and age groups.

HMI Considerations:

  • Adaptive interfaces
  • Localized user experience
  • Training and education embedded in HMI

08. Non-Deterministic Behavior of AI Systems

Problem: ADAS decisions based on AI models may not be easily explainable or predictable to humans.

Need for Explainable AI (XAI) in HMI:

  • Show why a system made a certain decision (e.g., sudden braking)
  • Increase driver confidence and understanding

Design Principles for Effective HMI in Shared Control

01. Predictability: The system’s next action should be understandable to the driver.

02. Consistency: Feedback patterns should remain consistent across scenarios.

03. Intuitiveness: Use familiar metaphors (e.g., green = safe, red = warning).

04. Redundancy: Use multiple channels (visual + audio + haptic) to reinforce critical messages.

05. Personalization: Allow user preferences for alert sensitivity, visual themes, or sound cues.

Tools and Technologies in HMI Development

01. Eye Tracking Systems: Used for attention monitoring and gaze-based feedback.

02. Head-Up Displays (HUDs): Visual information projected onto the windshield to keep eyes on the road.

03. Natural Language Interfaces (NLI): Voice assistants for safe, hands-free interaction.

04. Haptic Feedback: Steering wheel vibrations and seat rumble for alerts.

05. Augmented Reality (AR): Real-time environmental overlay for navigation or hazard indication.

Real-World Examples

01. Tesla Autopilot

  • Visual feedback on screen, limited audio alerts.
  • Criticized for insufficient takeover requests (TORs).

02. GM Super Cruise

  • Uses eye-tracking for driver engagement.
  • Lane-centering, adaptive cruise, and hands-free driving on mapped highways.

03. BMW iDrive 8

  • Combines touch, voice, and gesture control.
  • Integrates AR navigation and driver alerts.

04. Volvo Pilot Assist

  • Focuses heavily on TOR management.
  • Uses both haptic and visual cues to maintain driver readiness.

Regulatory and Safety Standards

  • ISO 21448 (SOTIF) – Safety of the Intended Functionality
  • UNECE Regulation 157 – Automated Lane Keeping Systems (ALKS)
  • Euro NCAP – Rates ADAS usability and handover performance
  • SAE J3016 – Levels of Driving Automation

These frameworks are pushing OEMs to improve shared-control HMI designs for transparency, trust, and safety.

Emerging Trends and Future Outlook

01. Emotion-Aware HMI: Systems that adapt to driver emotions using voice tone and facial expression.

02. AI-Driven Interfaces: ML models that adapt HMI behavior based on driver habits.

03. Brain-Computer Interfaces (BCI): Experimental technology where driver intentions are decoded via EEG.

04. Cloud-Connected HMIs: Receive updates, learn from shared fleet data, and adjust to real-time road conditions.

05. Personalized HMI Profiles: Linked to driver identity, preferences, and behavioral data.

Conclusion

As we progress toward higher levels of vehicle automation, the Human-Machine Interface becomes a critical point of success or failure in shared control systems. HMI is no longer just about buttons and screens—it’s about cognitive alignment, trust, timing, and communication between human and machine.

Successfully addressing the challenges in shared control will not only ensure safety but also accelerate the adoption of ADAS and future autonomous systems. The goal is not just machine intelligence but collaborative intelligence between driver and vehicle.

This was about “HMI Challenges In Shared Control Between Driver And ADAS”. Thank you for reading.
