
Human-Gen AI Co-Design: Exploring Factors Impacting Trust Calibration

  • Additional Information
    • Publication Information:
      ASME, 2025.
    • Publication Date:
      2025
    • Abstract:
      The process of generating ideas during co-design with a Generative AI (GenAI) system requires the gradual calibration of trust in that system. Trust plays a pivotal role in shaping human interactions with technology, and developing well-calibrated trust is essential for the effective use and integration of GenAI. Proper trust calibration helps prevent underutilization of the system’s capabilities and dissatisfaction with its output. For engineers and system designers, trust is particularly important as it directly influences user responses, system adoption, and overall engagement with new technologies. To explore the factors that influence trust fluctuation when co-designing with a GenAI system, we analyzed 12 hours of conceptual human-AI co-design sessions using a custom GenAI system capable of producing images across various generation modes from convergent-divergent to abstract-concrete, and combining text and sketch prompting. Focusing on each moment of interaction with GenAI-generated images, we conducted an incremental and qualitative coding of each trust-related extract from think-aloud protocols. Through this approach, we identified 23 key factors that cause fluctuations in trust. Our findings reveal a complex network of factors that impact trust calibration, offering insights into how GenAI systems can be designed to facilitate faster and more effective trust-building in human-GenAI collaborations.
    • Rights:
      open access
      http://purl.org/coar/access_right/c_abf2
      info:eu-repo/semantics/openAccess
    • Accession Number:
      edsorb.331279