Sony AI has released a dataset for testing the fairness and bias of AI models, called the Fair Human-Centric Image Benchmark ...
Tech Xplore on MSN
Human-centric photo dataset aims to help spot AI biases responsibly
A database of more than 10,000 human images to evaluate biases in artificial intelligence (AI) models for human-centric ...
Continuing its open-source streak, Meta today released a new AI benchmark, FACET, designed to evaluate the "fairness" of AI models that classify and detect things in photos and videos, including ...