GBDF: Gender Balanced DeepFake Dataset Towards Fair DeepFake Detection

Facial forgery by deepfakes has raised severe societal concerns. The vision community has proposed several solutions to effectively combat online misinformation via automated deepfake detection systems. Recent studies have …

Deep Generative Views to Mitigate Gender Classification Bias Across Gender-Race Groups

Published studies have suggested that automated face-based gender classification algorithms are biased across gender-race groups. Specifically, unequal accuracy rates were obtained for women and dark-skinned people. To mitigate the bias of gender …

Is Facial Recognition Biased at Near-Infrared Spectrum as Well?

Published academic research and media articles suggest face recognition is biased across demographics. Specifically, unequal performance is obtained for women, dark-skinned people, and older adults. However, these published studies have examined the …

Investigating Fairness of Ocular Biometrics Among Young, Middle-Aged, and Older Adults

A number of studies suggest bias of face biometrics, i.e., face recognition and soft-biometric estimation methods, across gender, race, and age groups. There is a recent urge to investigate the bias of different biometric modalities toward the …