Clinical decision support systems (CDSS) are computer-based applications that analyze data within electronic health records (EHRs). CDS algorithms are increasingly being integrated into healthcare systems to support patient care. However, research and development in ethical frameworks have revealed that CDS applications can perpetuate bias in healthcare. A recent EHR quality improvement study found significant differences in the accessibility, availability, and comprehensiveness of family history data by sex, race and ethnicity, and language preference. These findings suggest that historically medically underserved populations are excluded from identification by CDS tools that rely on family history information, unintentionally reinforcing existing healthcare disparities and potentially creating new ones.