She Elicits Requirements and He Tests: Software Engineering Gender Bias in Large Language Models

17 March 2023
Christoph Treude
Hideaki Hata
arXiv:2303.10131

Papers citing "She Elicits Requirements and He Tests: Software Engineering Gender Bias in Large Language Models"

A Comprehensive Analysis of Large Language Model Outputs: Similarity, Diversity, and Bias
Brandon Smith, Mohamed Reda Bouadjenek, Tahsin Alamgir Kheya, Phillip Dawson, S. Aryal
Topics: ALM, ELM
14 May 2025

How Do Generative Models Draw a Software Engineer? A Case Study on Stable Diffusion Bias
Tosin Fadahunsi, Giordano d'Aloisio, A. Di Marco, Federica Sarro
Topics: DiffM
15 Jan 2025

A Catalog of Fairness-Aware Practices in Machine Learning Engineering
Gianmario Voria, Giulia Sellitto, Carmine Ferrara, Francesco Abate, A. De Lucia, F. Ferrucci, Gemma Catolino, Fabio Palomba
Topics: FaML
29 Aug 2024

A Survey on Bias and Fairness in Machine Learning
Ninareh Mehrabi, Fred Morstatter, N. Saxena, Kristina Lerman, Aram Galstyan
Topics: SyDa, FaML
23 Aug 2019